Evaluation and Policy Analysis

The D.A.R.E. Program

What do we know about the D.A.R.E. Program?

Program Evaluation Results

RTI’s Evaluation Results

(images: wikipedia)

A Brief History of Evaluation Research



(images: City of Columbus)

Joint Committee’s Standards of Evaluation

Four features all evaluations should have:

  • Utility - ensures that an evaluation will serve the practical information needs of intended users.

  • Feasibility - ensures that an evaluation will be realistic, prudent, diplomatic, and frugal

  • Propriety - ensures that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation

  • Accuracy - ensures that an evaluation will reveal and convey technically adequate information about the features that determine the worth of the program

Evaluation Basics

Evaluation Alternatives

All evaluation is empirical and data driven. Objective and empirical assessments of policies and programs are the cornerstone of the evaluation field.


Evaluation of need (Is the program needed?)

Evaluability assessment (Can the program be evaluated?)

Evaluation of process (How does the program operate?)

Evaluation of impact (What is the program’s impact?)

Evaluation of efficiency (How efficient is the program?)

Evaluation Alternatives


Impact Evaluation - Analysis of the extent to which a treatment or other service has an effect


Efficiency Analysis - a type of analysis that compares program costs with program effects.

1.) Cost-benefit analysis - compares program cost with the economic value of the program's benefits


2.) Cost-effectiveness analysis - compares program cost with actual program outcomes.
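The distinction between the two can be sketched with some simple arithmetic. The figures below are purely hypothetical (they are not from any study mentioned in these slides); the point is only that cost-benefit analysis expresses both sides in dollars, while cost-effectiveness analysis expresses cost per unit of outcome:

```python
# Hypothetical figures for a treatment program (illustration only).
program_cost = 500_000.0          # total program cost, in dollars
monetized_benefits = 1_250_000.0  # dollar value assigned to benefits (e.g., avoided costs)
outcomes = 125                    # count of a program outcome (e.g., successful completions)

# Cost-benefit analysis: both sides in dollars, so we can compute
# a net benefit and a benefit-cost ratio.
net_benefit = monetized_benefits - program_cost
benefit_cost_ratio = monetized_benefits / program_cost

# Cost-effectiveness analysis: no monetizing of benefits; instead,
# cost per unit of the actual program outcome.
cost_per_outcome = program_cost / outcomes

print(f"Net benefit: ${net_benefit:,.0f}")                # $750,000
print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")    # 2.50
print(f"Cost per successful completion: ${cost_per_outcome:,.0f}")  # $4,000
```

A benefit-cost ratio above 1 means the monetized benefits exceed the cost; cost-effectiveness figures, by contrast, are only meaningful when compared across programs pursuing the same outcome.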

Case Study

Cost-Benefit Analysis of Therapeutic Communities

Sacks et al. (2002) conducted a cost-benefit analysis of modified therapeutic communities (TCs).

(images: ENSO Recovery)

Design Decisions

Black Box Evaluations or Program Theory

Black Box Evaluations

  • The focus is on whether cases seem to have changed as a result of exposure to the program.

(images: SlidePlayer)

Design Decisions

Black Box Evaluations or Program Theory

Program Theory

  • Describes what has been learned about how the program is effective.

(images: giphy)

Design Decisions

Researcher or Stakeholder Orientation

Stakeholder Approach

  • Encourages researchers to be responsive to program stakeholders.

  • Issues for study are centered on the views of people involved with the program, and reports are made for the participants.

  • In utilization-focused evaluation, the evaluator forms a task force that helps shape the evaluation project, making it more likely that the results will actually be used.

Design Decisions

Researcher or Stakeholder Orientation

Social Science Approach

  • Emphasizes the importance of researcher expertise and the maintenance of some autonomy in order to develop the most trustworthy, unbiased program evaluation.

Integrated Approaches

  • Attempt to cover issues of concern to both stakeholders and evaluators.

Evaluation in Action

Case Study

Problem-Oriented Policing in Violent Crime Areas

Braga et al. (1999)

A randomized experimental design was used to evaluate Problem-Oriented Policing.

Strength of Randomized Experimental Designs in Impact Evaluations


Braga et al. (1999) used a true experimental design.


  • What are the three elements of experimental design?


  • Are these findings generalizable? In other words, can we apply them to a larger population?

Qualitative and Quantitative Methods


Traditionally, evaluation methods are quantitative.


However, qualitative methods can often offer more depth in understanding program effectiveness, for example, figuring out what is inside "the black box." This can be accomplished through intensive interviewing of staff or clients.


Usually, the more complex the program, the more effective qualitative methods are.


Increasing Demand for Evidence-Based Policy



The Campbell Collaboration

  • An international research network

  • Purpose is to prepare and disseminate systematic reviews of social science evidence in three fields - criminal justice, education, and social welfare.

  • For more information, visit the Campbell Collaboration website

Ethics


These programs directly affect people's lives. Therefore, we must be honest and transparent about how the studies are conducted.


Furthermore, continuing programs that have no evidence of working (e.g., the D.A.R.E. program) takes money away from other programs that may benefit people's lives.


This is why policy and evaluation research is so important.

Thanks for coming! Have a good weekend!


(images: giphy)